On Convergence Rates of Linearized Proximal Algorithms for Convex Composite Optimization with Applications

Authors

  • Yaohua Hu
  • Chong Li
  • Xiaoqi Yang
Abstract

In this paper, we investigate a linearized proximal algorithm (LPA) for solving a convex composite optimization problem. Each iteration of the LPA is a proximal minimization of the composition of the outer function with the linearization of the inner function at the current iterate. The LPA has the attractive computational advantage that the solution of each subproblem is a singleton, which avoids the difficulty of finding the whole solution set of the subproblem, as in the Gauss-Newton method (GNM), while still maintaining the same local convergence rate as the GNM. Under the assumptions of local weak sharp minima of order p (p ∈ [1, 2]) and the quasi-regularity condition, we establish a local superlinear convergence rate for the LPA. We also propose a globalization strategy for the LPA based on a backtracking line search, together with an inexact version of the LPA, and establish their superlinear convergence results. We further apply the LPA to solve a feasibility problem and a sensor network localization problem. Our numerical results illustrate that the LPA meets the demand for an efficient and robust algorithm for the sensor network localization problem.
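For concreteness, one LPA iteration can be sketched as follows. This is a minimal sketch, assuming the outer convex function h is the l1 norm and using cvxpy to solve the subproblem; the helper names (lpa_step, F, JF) and the two-anchor toy data are illustrative assumptions, not the paper's setup.

    # A minimal sketch of the LPA iteration
    #   x_{k+1} = argmin_x  h(F(x_k) + JF(x_k)(x - x_k)) + (1/(2v)) ||x - x_k||^2,
    # assuming h = l1 norm; cvxpy solves the strongly convex subproblem.
    import numpy as np
    import cvxpy as cp

    def lpa_step(x_k, F, JF, v):
        x = cp.Variable(x_k.size)
        model = F(x_k) + JF(x_k) @ (x - x_k)              # linearize the inner map
        obj = cp.norm1(model) + cp.sum_squares(x - x_k) / (2 * v)
        cp.Problem(cp.Minimize(obj)).solve()
        return x.value

    # Toy sensor-localization-style example (illustrative, not the paper's data):
    # find a point whose distances to two anchors match the given ranges.
    anchors = np.array([[0.0, 0.0], [4.0, 0.0]])
    ranges = np.array([2.5, 2.5])
    F = lambda x: np.linalg.norm(x - anchors, axis=1) - ranges
    JF = lambda x: (x - anchors) / np.linalg.norm(x - anchors, axis=1, keepdims=True)

    x = np.array([1.0, 1.0])
    for _ in range(10):
        x = lpa_step(x, F, JF, v=1.0)

Because the quadratic term ||x - x_k||^2 / (2v) is added to a convex model, the subproblem is strongly convex and its minimizer is unique; this is the "singleton" advantage over the Gauss-Newton subproblem noted in the abstract.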


Similar Papers

SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives

In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike S...
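For reference, the SAGA update can be sketched as follows, assuming squared-loss components and an l1 regulariser handled through its proximal operator (soft-thresholding); the function names and data layout are illustrative, not taken from the paper.

    # A minimal sketch of SAGA for min (1/n) sum_i (1/2)(a_i.x - b_i)^2 + lam ||x||_1.
    import numpy as np

    def soft_threshold(z, t):
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def saga(A, b, lam, gamma, iters, rng=np.random.default_rng(0)):
        n, d = A.shape
        x = np.zeros(d)
        table = A * (A @ x - b)[:, None]       # stored per-example gradients
        avg = table.mean(axis=0)
        for _ in range(iters):
            j = rng.integers(n)
            g_new = A[j] * (A[j] @ x - b[j])   # fresh gradient of component j
            x = soft_threshold(x - gamma * (g_new - table[j] + avg), gamma * lam)
            avg += (g_new - table[j]) / n      # keep the running average in sync
            table[j] = g_new
        return x

The variance-reduced direction g_new - table[j] + avg is what gives SAGA its fast linear rate, and the proximal step is what supplies the "support for composite objectives" the abstract mentions.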


A General Inertial Proximal Point Algorithm for Mixed Variational Inequality Problem

In this paper, we first propose a general inertial proximal point algorithm (PPA) for the mixed variational inequality (VI) problem. To the best of our knowledge, without stronger assumptions no convergence rate result is known in the literature for inertial-type PPAs. Under certain conditions, we are able to establish the global convergence and a nonasymptotic O(1/k) convergence rate result (under c...
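One inertial PPA step can be sketched as follows; this is a minimal illustration for a plain minimization problem rather than the mixed VI setting of the paper, and all names are illustrative.

    # A minimal sketch of one inertial proximal point step for min_x f(x),
    # assuming prox_f is available in closed form; alpha_k in [0, 1) is the
    # inertial weight.
    import numpy as np

    def inertial_ppa_step(x_k, x_prev, prox_f, lam, alpha_k):
        y = x_k + alpha_k * (x_k - x_prev)   # inertial extrapolation
        return prox_f(y, lam)                # standard PPA (resolvent) step

    # Example with f = | . | in one dimension (soft-thresholding prox):
    prox_abs = lambda z, lam: np.sign(z) * max(abs(z) - lam, 0.0)
    x_prev, x_k = 5.0, 4.0
    for _ in range(30):
        x_k, x_prev = inertial_ppa_step(x_k, x_prev, prox_abs, 0.5, 0.3), x_k
    # x_k approaches 0, the minimizer of |x|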


On the Q-linear Convergence of a Majorized Proximal ADMM for Convex Composite Programming and Its Applications to Regularized Logistic Regression

This paper aims to study the convergence rate of a majorized alternating direction method of multipliers with indefinite proximal terms (iPADMM) for solving linearly constrained convex composite optimization problems. We establish a Q-linear convergence rate theorem for the 2-block majorized iPADMM under mild conditions. Based on this result, the convergence rate analysis of symmetric Gaussian-Sei...
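As a point of reference, a plain 2-block ADMM for the lasso can be sketched as below. This is a simplified baseline, not the majorized iPADMM of the paper: the majorization step and the indefinite proximal terms are omitted, and a least-squares loss stands in for the paper's regularized logistic regression.

    # Plain 2-block ADMM (scaled form) for
    #   min (1/2)||A x - b||^2 + lam ||z||_1   s.t.   x - z = 0.
    import numpy as np

    def admm_lasso(A, b, lam, rho=1.0, iters=200):
        n, d = A.shape
        x = z = u = np.zeros(d)
        M = np.linalg.inv(A.T @ A + rho * np.eye(d))   # cached x-update factor
        Atb = A.T @ b
        for _ in range(iters):
            x = M @ (Atb + rho * (z - u))              # x-block: ridge-type solve
            z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # prox
            u = u + x - z                              # scaled dual update
        return z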


Proximal Algorithms in Statistics and Machine Learning

In this paper we develop proximal methods for statistical learning. Proximal point algorithms are useful in statistics and machine learning for obtaining optimization solutions for composite functions. Our approach exploits closed-form solutions of proximal operators and envelope representations based on the Moreau, Forward-Backward, Douglas-Rachford and Half-Quadratic envelopes. Envelope repres...
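Two of the ingredients the abstract mentions, a closed-form proximal operator and its Moreau envelope, can be illustrated as follows for f(x) = lam ||x||_1; these are standard formulas, assumed here for illustration rather than taken from the paper.

    # Closed-form proximal operator and Moreau envelope of f = lam ||.||_1.
    import numpy as np

    def prox_l1(z, lam):
        # argmin_x lam ||x||_1 + (1/2) ||x - z||^2  (soft-thresholding)
        return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

    def moreau_l1(z, lam):
        # M_f(z) = min_x lam ||x||_1 + (1/2) ||x - z||^2, a smooth
        # approximation of f satisfying M_f <= f (the Moreau envelope)
        x = prox_l1(z, lam)
        return lam * np.abs(x).sum() + 0.5 * np.sum((x - z) ** 2)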


Asynchronous Stochastic Proximal Methods for Nonconvex Nonsmooth Optimization

We study stochastic algorithms for solving non-convex optimization problems with a convex yet possibly non-smooth regularizer, which find wide application in many practical machine learning problems. However, compared to asynchronous parallel stochastic gradient descent (AsynSGD), an algorithm targeting smooth optimization, the understanding of the behavior of stochastic algorithms for the n...
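The serial building block of such methods, a proximal stochastic gradient step, can be sketched as follows; the asynchronous delay and staleness bookkeeping that the abstract actually studies is omitted, and all names are illustrative.

    # One proximal stochastic gradient step for min_x E[f(x; xi)] + h(x):
    # a gradient step on the smooth sampled loss, then the proximal operator
    # of the (possibly non-smooth) regularizer h.
    def prox_sgd_step(x, stochastic_grad, prox_h, gamma):
        return prox_h(x - gamma * stochastic_grad(x), gamma)

With h = lam ||.||_1, prox_h is the soft-thresholding operator shown in the sketches above; with h = 0, the step reduces to plain SGD.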



Journal:
  • SIAM Journal on Optimization

Volume 26, Issue -

Pages -

Publication date: 2016